4,481 research outputs found

    Two for the Price of One: Lifting Separation Logic Assertions

    Recently, data abstraction has been studied in the context of separation logic, with noticeable practical successes: the developed logics have enabled clean proofs of tricky programs, such as subject-observer patterns, and they have become the basis of efficient verification tools for Java (jStar), C (VeriFast) and Hoare Type Theory (Ynot). In this paper, we give a new semantic analysis of such logic-based approaches using Reynolds's relational parametricity. The core of the analysis is our lifting theorems, which give a sound and complete condition for when a true implication between assertions in the standard interpretation entails that the same implication holds in a relational interpretation. Using these theorems, we provide an algorithm for identifying abstraction-respecting client-side proofs; the proofs ensure that clients cannot distinguish two appropriately related module implementations.
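    The informal idea behind the relational reading of abstraction can be illustrated with a toy example (all names here are hypothetical, not from the paper): two module implementations related by an invariant are indistinguishable to any client that uses only the public interface.

```python
# Toy illustration of representation independence: two counter modules
# whose internal states are related by R(a, b): a == -b. A client that
# uses only the interface cannot tell them apart. This is only a sketch
# of the general idea; the paper treats this relationally for
# separation-logic assertions.

class UpCounter:
    def __init__(self): self.s = 0
    def inc(self): self.s += 1
    def read(self): return self.s

class DownCounter:                  # stores the negation internally
    def __init__(self): self.s = 0
    def inc(self): self.s -= 1
    def read(self): return -self.s  # observable behaviour is identical

def client(counter):
    """A client program written against the interface only."""
    for _ in range(3):
        counter.inc()
    return counter.read()

print(client(UpCounter()), client(DownCounter()))  # → 3 3
```

    The lifting theorems characterise when ordinary (one-implementation) proofs about such clients remain valid under this relational interpretation.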

    Programmatic and Direct Manipulation, Together at Last

    Direct manipulation interfaces and programmatic systems have distinct and complementary strengths. The former provide intuitive, immediate visual feedback and enable rapid prototyping, whereas the latter enable complex, reusable abstractions. Unfortunately, existing systems typically force users into just one of these two interaction modes. We present a system called Sketch-n-Sketch that integrates programmatic and direct manipulation for the particular domain of Scalable Vector Graphics (SVG). In Sketch-n-Sketch, the user writes a program to generate an output SVG canvas. Then the user may directly manipulate the canvas while the system immediately infers a program update in order to match the changes to the output, a workflow we call live synchronization. To achieve this, we propose (i) a technique called trace-based program synthesis that takes program execution history into account in order to constrain the search space and (ii) heuristics for dealing with ambiguities. Based on our experience with examples spanning 2,000 lines of code and from the results of a preliminary user study, we believe that Sketch-n-Sketch provides a novel workflow that can augment traditional programming systems. Our approach may serve as the basis for live synchronization in other application domains, as well as a starting point for yet more ambitious ways of combining programmatic and direct manipulation.
    Comment: PLDI 2016 Paper + Supplementary Appendices
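    A minimal, language-agnostic sketch of the trace-based idea (all names hypothetical; the real system operates on an SVG-producing functional language): each output value records which program parameters produced it, so a direct edit to the output can be inverted into a program update, with heuristics resolving ambiguity.

```python
# Minimal illustration of trace-based live synchronization: the
# "program" is a parameter dictionary, and each rendered attribute
# carries a trace of its provenance. Names and heuristics here are
# invented for illustration only.

def render(params):
    """Run the 'program': each shape's x records a trace showing it
    was computed as x0 + i * dx."""
    shapes = []
    for i in range(params["count"]):
        x = params["x0"] + i * params["dx"]
        shapes.append({"x": x, "trace": ("x0", "dx", i)})
    return shapes

def synchronize(params, shape_index, new_x):
    """User dragged shape `shape_index` to `new_x`. Invert its trace
    to find a program update. Ambiguity: either x0 or dx could change;
    a simple heuristic updates the spacing dx whenever i > 0."""
    x0_name, dx_name, i = render(params)[shape_index]["trace"]
    new_params = dict(params)
    if i == 0:
        new_params[x0_name] = new_x      # only x0 can explain shape 0
    else:
        new_params[dx_name] = (new_x - params[x0_name]) / i
    return new_params

params = {"count": 3, "x0": 10, "dx": 50}   # shapes at x = 10, 60, 110
updated = synchronize(params, 2, 130)       # drag 3rd shape from 110 to 130
print([s["x"] for s in render(updated)])    # → [10.0, 70.0, 130.0]
```

    Note how the single edit propagates consistently to every shape that shares the updated parameter, which is the essence of live synchronization.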

    Towards A Graphene Chip System For Blood Clotting Disease Diagnostics

    Point of care diagnostics (POCD) allows the rapid, accurate measurement of analytes near to a patient. This enables faster clinical decision making and can lead to earlier diagnosis and better patient monitoring and treatment. However, despite many prospective POCD devices being developed for a wide range of diseases, this promised technology has yet to be translated to a clinical setting due to the lack of a cost-effective biosensing platform.

    This thesis focuses on the development of a highly sensitive, low-cost and scalable biosensor platform that combines graphene with semiconductor fabrication techniques to create graphene field-effect transistor biosensors. The key challenges of designing and fabricating a graphene-based biosensor are addressed. This work focuses on a specific platform for blood clotting disease diagnostics, but the platform has the capability of being applied to any disease with a detectable biomarker.

    Multiple sensor designs were tested during this work that maximised sensor efficiency and minimised costs for different applications. The multiplex design enabled different graphene channels on the same chip to be functionalised with unique chemistry. The Inverted MOSFET design was created, which allows back-gated measurements to be performed whilst keeping the graphene channel open for functionalisation. The Shared Source and Matrix design maximises the total number of sensing channels per chip, resulting in the most cost-effective fabrication approach for a graphene-based sensor (decreasing the cost per channel from £9.72 to £4.11).

    The challenge of integrating graphene into a semiconductor fabrication process is also addressed through the development of a novel vacuum transfer methodology that allows photoresist-free transfer. The two main fabrication processes, graphene supplied on the wafer ("Pre-Transfer") and graphene transferred after metallisation ("Post-Transfer"), were compared in terms of graphene channel resistance and final graphene quality (defect density and photoresist residue). The Post-Transfer process produced higher-quality graphene (less damage, residue and doping, confirmed by Raman spectroscopy).

    Following sensor fabrication, the next stages of creating a sensor platform involve the passivation and packaging of the sensor chip. Different dielectric deposition approaches are compared for passivation. Molecular Vapour Deposition (MVD) of Al2O3 was shown to produce graphene channels with lower damage than unprocessed graphene, and it also improves graphene doping, bringing the Dirac point of the graphene close to 0 V. The packaging integration of microfluidics is investigated, comparing traditional soft lithography approaches with the newer 3D-printed microfluidic approach. Specific microfluidic packaging for blood separation, towards a blood-sampling point of care sensor, is examined; the laminar approach is identified as a method of pre-processing the blood sample, lowering the blood cell count before sensing.

    To test the sensitivity of the Post-Transfer, MVD-passivated graphene sensor developed in this work, real-time IV measurements were performed to identify thrombin protein binding in real time on the graphene surface. The sensor was functionalised using a thrombin-specific aptamer solution, and real-time IV measurements were performed on the functionalised graphene sensor with a range of biologically relevant protein concentrations. The resulting sensitivity of the graphene sensor was in the 1-100 pg/ml concentration range, producing a resistance change of 0.2% per pg/ml. Specificity was confirmed using a non-thrombin-specific aptamer as the negative control.

    These results indicate that the graphene sensor platform developed in this thesis has potential as a highly sensitive POCD. The processes developed here can be used to develop graphene sensors for multiple biomarkers in the future.
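    As a back-of-envelope check on the reported figures, the quoted sensitivity (a 0.2% resistance change per pg/ml over the 1-100 pg/ml range) can be inverted to estimate concentration from a resistance reading. The linear model and the example resistances below are illustrative assumptions, not values from the thesis; a real device would need a fitted calibration curve.

```python
# Hedged sketch: converting a measured relative resistance change into
# a thrombin concentration, assuming the reported ~0.2 %/(pg/ml)
# sensitivity and a linear response (a simplification).

SENSITIVITY = 0.002   # fractional resistance change per pg/ml (reported)

def concentration_pg_ml(r_baseline, r_measured):
    """Estimate concentration from baseline and measured resistance."""
    delta = abs(r_measured - r_baseline) / r_baseline
    return delta / SENSITIVITY

# e.g. a 10 % resistance shift (1000 -> 1100 ohm, assumed values)
# corresponds to ~50 pg/ml, inside the sensor's 1-100 pg/ml range
print(round(concentration_pg_ml(1000.0, 1100.0), 1))  # → 50.0
```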

    Bayard Rustin: The Intersection of Sexuality and Civil Rights


    Development of Graphene-Filled Fluoropolymer Composite Coatings for Condensing Heat Exchangers

    Low-temperature waste heat recovery employs condensing heat exchangers to recover both sensible and latent heats. Due to the condensation of flue gases within these heat exchangers, they are subjected to severe corrosion. Perfluoroalkoxy (PFA) has been applied as a barrier layer on the surfaces of these heat exchangers for the prevention of corrosion. However, PFA has exhibited poor thermal properties and durability, which are requirements in low-temperature heat recovery applications. In this thesis, carbon-based nano-materials (8 nm and 60 nm thickness graphene particles and multi-walled carbon nanotubes, MWCNT) were incorporated into fluoropolymer (PFA) powders to generate thermally conductive and corrosion resistant composites as heat exchanger coatings. The microstructure, thermal conductivity, and electrical conductivity of these composites were characterized. It was found that the thermal conductivities of the graphene-filled composites were significantly higher than that of the virgin PFA, i.e. approximately 8 times, while the composites containing MWCNT particles exhibited minimal improvement in thermal properties. The coatings containing both grades of graphene exhibited good surface finish and coating adhesion, good wear resistance and excellent corrosion resistance. The MWCNT-filled composites showed poor surface finish, resulting in poor corrosion resistance.
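    The practical effect of the reported ~8x conductivity gain can be sketched with a 1-D slab model of the coating's conductive resistance, R = t/(kA). The coating thickness and the virgin-PFA conductivity below are typical literature-style assumptions, not values from the thesis.

```python
# Hedged sketch: conductive resistance of a coating layer, 1-D slab
# model R = t / (k * A). Only the ~8x conductivity ratio comes from
# the abstract; thickness and k_pfa are assumed illustrative values.

def slab_resistance(thickness_m, k, area_m2=1.0):
    """Conductive thermal resistance of a flat layer, in K/W."""
    return thickness_m / (k * area_m2)

t = 50e-6                 # 50 um coating thickness (assumed)
k_pfa = 0.19              # virgin PFA, W/(m*K) (assumed typical value)
k_comp = 8 * k_pfa        # graphene-filled composite (reported ~8x)

r_pfa = slab_resistance(t, k_pfa)
r_comp = slab_resistance(t, k_comp)
print(f"{r_pfa / r_comp:.0f}x lower conductive resistance")  # → 8x
```

    Because the slab model is linear in k, the coating's conductive resistance drops by the same factor as the conductivity rises, which is why the filler loading matters so much for condensing heat-exchanger duty.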

    ON THE DISTRIBUTION OF THE NUMBER OF PRIME FACTORS OF AN INTEGER

    The distribution of the prime numbers has intrigued number theorists for centuries. As our understanding of this distribution has evolved, so too have our methods of analyzing the related arithmetic functions. If we let ω(n) denote the number of distinct prime divisors of a natural number n, then the celebrated Erdős–Kac Theorem states that the values of ω(n) are normally distributed (satisfying a central limit theorem as n varies). This result is considered the beginning of Probabilistic Number Theory. We present a modern proof of the Erdős–Kac Theorem using a moment based argument due to Granville and Soundararajan, which we explain in full detail. We also use similar techniques to study the second moment of ω(n), refining a classical result of Turán.

    Score study procedures and processes among instrumental music teachers and students of varying experience.

    Score study is agreed to be an essential part of musical preparation by professional conductors. However, no single method for score study has been established. This thesis examines the score study habits of undergraduate music majors, graduate students with a minimum of five years of instrumental teaching experience, and highly qualified instrumental music teachers with a minimum of ten years of experience. Qualitative methods are employed, and interview transcriptions are the primary sources of data. Findings showed that instrumental music teachers and those with instrumental teaching experience base many of their score study decisions on the abilities and needs of students and/or performers. Implications for conducting teachers are suggested that may improve the score study methods of students interested in teaching instrumental music.